
[dependency] sdk/log/xlog: Add FilterProcessor and EnabledParameters #6271

Closed
wants to merge 25 commits

Conversation

pellared
Member

@pellared pellared commented Feb 5, 2025

This introduces a new go.opentelemetry.io/otel/sdk/log/xlog module.
I created it for two reasons:

The SDK depends on xlog, but the SDK does not expose anything from xlog in its public API (apart from documentation). xreceiver has a similar design. More: #6271 (comment)

goos: linux
goarch: amd64
pkg: go.opentelemetry.io/otel/sdk/log
cpu: 13th Gen Intel(R) Core(TM) i7-13800H
                 │   old.txt   │                 new.txt                  │
                 │   sec/op    │     sec/op      vs base                  │
LoggerEnabled-20   4.589n ± 1%   328.000n ± 10%  +7047.53% (p=0.000 n=10)

                 │   old.txt    │            new.txt             │
                 │     B/op     │     B/op      vs base          │
LoggerEnabled-20   0.000Ki ± 0%   1.124Ki ± 9%  ? (p=0.000 n=10)

                 │  old.txt   │            new.txt             │
                 │ allocs/op  │ allocs/op   vs base            │
LoggerEnabled-20   0.000 ± 0%   0.000 ± 0%  ~ (p=1.000 n=10) ¹
¹ all samples are equal

As we can see, calling Enabled does not cause any heap allocation, and its speed is comparable to emitting the "smallest" log record: #6315
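
To make the new surface concrete, here is a minimal sketch of a processor that opts in to filtering. It assumes xlog exposes a FilterProcessor interface with an Enabled(context.Context, EnabledParameters) bool method and that EnabledParameters carries at least the record severity; the names below are assumptions for illustration, not the merged API.

```go
package example

import (
	"context"

	"go.opentelemetry.io/otel/log"
	sdklog "go.opentelemetry.io/otel/sdk/log"
	"go.opentelemetry.io/otel/sdk/log/xlog"
)

// minSeverityProcessor wraps a regular SDK processor and additionally
// implements the experimental filtering hook so that records below the
// configured severity are reported as disabled.
type minSeverityProcessor struct {
	sdklog.Processor
	min log.Severity
}

// Enabled is the experimental hook; the Severity field on
// xlog.EnabledParameters is an assumption made for this sketch.
func (p *minSeverityProcessor) Enabled(_ context.Context, param xlog.EnabledParameters) bool {
	return param.Severity >= p.min
}

// Compile-time check that the wrapper opts in to filtering.
var _ xlog.FilterProcessor = (*minSeverityProcessor)(nil)
```

The SDK would then detect the capability, presumably via a type assertion against xlog.FilterProcessor, when Logger.Enabled is called; that path is what the benchmark above exercises.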

@pellared pellared changed the title Move sdk/log/internal/x to sdk/xlog sdk/xlog: Add FilterProcessor and EnabledParameters Feb 5, 2025
@pellared pellared self-assigned this Feb 5, 2025
@pellared pellared added area:logs Part of OpenTelemetry logs pkg:SDK Related to an SDK package labels Feb 5, 2025

codecov bot commented Feb 5, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 81.8%. Comparing base (d9ab149) to head (46ba89f).
Report is 2 commits behind head on main.

@@          Coverage Diff          @@
##            main   #6271   +/-   ##
=====================================
  Coverage   81.8%   81.8%           
=====================================
  Files        283     283           
  Lines      24892   24900    +8     
=====================================
+ Hits       20379   20389   +10     
+ Misses      4109    4107    -2     
  Partials     404     404           

see 4 files with indirect coverage changes

@pellared pellared marked this pull request as ready for review February 5, 2025 21:48
@dashpole
Contributor

dashpole commented Feb 6, 2025

This seems like a good improvement over the previous approach. I don't see any drawbacks and would be comfortable using this approach elsewhere.

Review thread on sdk/log/logger.go (outdated, resolved)
@dmathieu
Member

dmathieu commented Feb 6, 2025

I like this approach. This is also how the collector does experimental changes in stable packages, for example with xreceiver.

@pellared
Member Author

pellared commented Feb 6, 2025

I like this approach. This is also how the collector does experimental changes in stable packages, for example with xreceiver.

I see that xreceiver is under receiver. Maybe it is a good thing to do, as it gives xreceiver access to receiver/internal. Does my reasoning make sense? If so, then the module should probably be sdk/log/xlog.

@dmathieu
Member

dmathieu commented Feb 6, 2025

That definitely makes sense.

@pellared
Member Author

pellared commented Feb 6, 2025

That definitely makes sense.

Done 4e54827

@pellared pellared changed the title sdk/xlog: Add FilterProcessor and EnabledParameters sdk/log/xlog: Add FilterProcessor and EnabledParameters Feb 6, 2025
Review thread on sdk/log/xlog/go.mod (outdated, resolved):
@@ -10,6 +10,7 @@ require (
go.opentelemetry.io/otel v1.34.0
go.opentelemetry.io/otel/log v0.10.0
go.opentelemetry.io/otel/sdk v1.34.0
go.opentelemetry.io/otel/sdk/log/xlog v0.0.0-00010101000000-000000000000
Contributor

Doesn't this mean that, when this is stabilized, this module will depend on an unstable module we control?

Member Author

@pellared pellared Feb 6, 2025

Yes, but I think the impact would be acceptable.

This would require users to use compatible versions. It would mostly affect users who opt in to xlog, making it a direct dependency. FWIW, most people do not bump indirect dependencies.

Personally, I think it is an acceptable trade-off.

Similarly, we depend on the unstable github.com/google/go-cmp and golang.org/x/sys, and I do not think it prevents us from stabilizing the modules that use them.

I think it is good not to depend on unstable modules, but it is not always feasible.

Contributor

Similarly, we depend on the unstable github.com/google/go-cmp and golang.org/x/sys, and I do not think it prevents us from stabilizing the modules that use them.

These are not maintained by us though.

Contributor

This would require users to use compatible versions. It would mostly affect users who opt in to xlog, making it a direct dependency. FWIW, most people do not bump indirect dependencies.

I think this is the crux of the problem. We need to provide stability guarantees for the SDK we release. I don't think saying "people do not bump indirect dependencies" is a valid way to get around this.

Contributor

This seems like it will share the same issue we had with cross-module internal dependencies, right?

Member

@XSAM XSAM Feb 6, 2025

If not, then I think I can make Enabled take a longer list of arguments (context, severity, scope, resource) instead of EnabledParameters (which could be introduced when the feature is going to be stabilized).

Are you suggesting we are going to use the same EnabledParameters for Logger.Enabled and FilterProcessor.Enabled? That would put us in an awkward position if we need to change FilterProcessor.Enabled after stabilizing the Logger.

I think we can copy the experimental interface definitions to the stable module when we want to make the feature stable. After that, we drop the use of the experimental interfaces in the SDK. This will break all existing processors that implement the experimental interfaces, but that seems fine, as they are experimental features.

Member Author

Are you suggesting we are going to use the same EnabledParameters for Logger.Enabled and FilterProcessor.Enabled?

No, we need scope.Instrumentation and resource.Resource. I meant something like

type FiltererProcessor interface {
   Enabled(context.Context, log.Severity, scope.Instrumentation, resource.Resource) bool
}

Member Author

@pellared pellared Feb 7, 2025

We can use tooling to copy experimental interface definitions to the stable module if we want to ensure syncing, but not having the to-be-stable module depend on the unstable module prevents a set of user bugs and is preferable.

As suspected, it does not work with type assertions. However, I think we can try using reflect instead to make it work.

The benefit of this approach is that it should never break compilation for users. It mitigates the problem if they happen to run into a conflict and have an incompatible version of xlog (e.g. via transitive dependencies).

The main drawback for users is the lack of a compile-time guarantee for those who opt in to the experimental feature.
Another drawback is the complexity of the implementation (on our side) because of the use of reflection.
A minor drawback is a little more overhead.

It seems reasonable to favor stability guarantees. It also looks like a safer, less disruptive approach for our users. It is definitely worth creating a PR that tries it so that we can compare both approaches.
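
A rough sketch of what such reflection-based detection could look like (this is not the code from #6286; the method name Enabled, the func(context.Context, T) bool shape, and the Severity field are all assumptions):

```go
package example

import (
	"context"
	"reflect"

	"go.opentelemetry.io/otel/log"
)

// enabledViaReflection checks whether p exposes an Enabled(context.Context, T) bool
// method without the SDK importing the module that defines T. It returns
// (result, true) when such a method was found and called, and (false, false)
// otherwise.
func enabledViaReflection(ctx context.Context, p any, sev log.Severity) (bool, bool) {
	m := reflect.ValueOf(p).MethodByName("Enabled")
	if !m.IsValid() {
		return false, false
	}
	t := m.Type()
	ctxType := reflect.TypeOf((*context.Context)(nil)).Elem()
	// Expect func(context.Context, T) bool where T is a struct.
	if t.NumIn() != 2 || t.In(0) != ctxType ||
		t.In(1).Kind() != reflect.Struct ||
		t.NumOut() != 1 || t.Out(0).Kind() != reflect.Bool {
		return false, false
	}
	// Build the parameters struct field-by-field so there is no compile-time
	// dependency on the experimental module that defines it.
	param := reflect.New(t.In(1)).Elem()
	if f := param.FieldByName("Severity"); f.IsValid() && f.CanSet() &&
		f.Type() == reflect.TypeOf(sev) {
		f.Set(reflect.ValueOf(sev))
	}
	out := m.Call([]reflect.Value{reflect.ValueOf(ctx), param})
	return out[0].Bool(), true
}
```

MethodByName, reflect.New, and Call allocate on every check, which illustrates where the kind of per-call overhead later mentioned for #6286 can come from.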

The next alternative is to have an experimental interface like

type FiltererProcessor interface {
   Enabled(context.Context, log.Severity, scope.Instrumentation, resource.Resource) bool
}

This way we would be able to use type assertions, but I think it is a bad idea given that we would like to introduce EnabledParameters. It would make the migration from experimental to stable very awkward and inconvenient. I do not like it and do not want to go forward with it; it would look very hacky and unprofessional. Lastly, we would not be able to use such an approach for more complex parameters.

Member Author

@pellared pellared Feb 7, 2025

#6286 is ready for review.

Member Author

I think the performance overhead in the other PR is less acceptable than the potential build failures that users can run into.

Review thread on versions.yaml (outdated, resolved)
@dmathieu
Member

dmathieu commented Feb 6, 2025

From the SIG meeting: could we add a check (maybe a lint) to ensure no SDK uses an x module?
The reasoning is that if we make a breaking change to an x module which is used by the SDK, folks have to upgrade everything in lockstep.

If we ensure no stable SDK uses the x modules, the only folks who have to upgrade in lockstep are folks who made the conscious choice to use an unstable module.

Member

@XSAM XSAM left a comment

I like the current approach. It looks promising.

Review thread on sdk/log/logger.go (outdated, resolved)
Review thread on sdk/log/xlog/filter_processor.go (outdated, resolved)
Review thread on sdk/log/example_test.go (outdated, resolved)

@XSAM
Member

XSAM commented Feb 6, 2025

From the SIG meeting: could we add a check (maybe a lint) to ensure no SDK uses an x module? The reasoning is that if we make a breaking change to an x module which is used by the SDK, folks have to upgrade everything in lockstep.

If we ensure no stable SDK uses the x modules, the only folks who have to upgrade in lockstep are folks who made the conscious choice to use an unstable module.

This PR already makes the SDK use an x module in order to make experimental features happen. Unless we use a shared type in the SDK for x modules (like a shared EnabledParameters; maybe it could be any or a map), I am not sure we can implement experimental features without importing them.

@pellared pellared changed the title sdk/log/xlog: Add FilterProcessor and EnabledParameters [prior art] sdk/log/xlog: Add FilterProcessor and EnabledParameters Feb 7, 2025
@pellared pellared marked this pull request as draft February 7, 2025 13:22
@pellared pellared closed this Feb 12, 2025
@pellared pellared reopened this Feb 13, 2025
@pellared pellared changed the title [prior art] sdk/log/xlog: Add FilterProcessor and EnabledParameters [dependency] sdk/log/xlog: Add FilterProcessor and EnabledParameters Feb 13, 2025
@pellared
Member Author

pellared commented Feb 13, 2025

I reopened this PR because of the huge performance overhead (slower execution and 16 allocations per call) of #6286.

@pellared pellared marked this pull request as ready for review February 13, 2025 13:19
@pellared
Member Author

SIG meeting notes:
We agreed that we can move FilterProcessor directly to sdk/log, as the Logs SDK does not look like it will be stabilized soon.

@pellared pellared closed this Feb 13, 2025
pellared added a commit that referenced this pull request Feb 18, 2025
Per
#6271 (comment)

> We agreed that we can move `FilterProcessor` directly to `sdk/log`, as the
> Logs SDK does not look like it will be stabilized soon.

- Add the possibility to filter based on the resource and scope, which are
  available to the SDK. The scope information is the most important, as it
  makes it possible to e.g. filter out logs emitted by a given logger (a
  sketch of such a filter follows this list). Thus e.g.
  open-telemetry/opentelemetry-specification#4364 is not necessary. See
  open-telemetry/opentelemetry-specification#4290 (comment) for more context.
- It is going to be an example for
  open-telemetry/opentelemetry-specification#4363
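
For example, a hedged sketch of a processor that uses the scope to drop records emitted by one noisy logger (the EnabledParameters field names below are assumptions based on the description above, not the merged API):

```go
package example

import (
	"context"

	sdklog "go.opentelemetry.io/otel/sdk/log"
)

// dropNoisyLogger wraps a regular processor and reports records emitted by
// one specific logger (instrumentation scope) as disabled.
type dropNoisyLogger struct {
	sdklog.Processor
}

// Enabled assumes EnabledParameters exposes the instrumentation scope;
// the field name is illustrative.
func (p dropNoisyLogger) Enabled(_ context.Context, param sdklog.EnabledParameters) bool {
	// "github.com/example/noisy" is a hypothetical logger name.
	return param.InstrumentationScope.Name != "github.com/example/noisy"
}
```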

There is a little overhead (IMO totally acceptable) because of data
transformation. Most importantly, there is no new heap allocation.

```
goos: linux
goarch: amd64
pkg: go.opentelemetry.io/otel/sdk/log
cpu: 13th Gen Intel(R) Core(TM) i7-13800H
                 │   old.txt   │                 new.txt                  │
                 │   sec/op    │     sec/op      vs base                  │
LoggerEnabled-20   4.589n ± 1%   319.750n ± 16%  +6867.75% (p=0.000 n=10)

                 │   old.txt    │             new.txt             │
                 │     B/op     │     B/op       vs base          │
LoggerEnabled-20   0.000Ki ± 0%   1.093Ki ± 13%  ? (p=0.000 n=10)

                 │  old.txt   │            new.txt             │
                 │ allocs/op  │ allocs/op   vs base            │
LoggerEnabled-20   0.000 ± 0%   0.000 ± 0%  ~ (p=1.000 n=10) ¹
¹ all samples are equal
```

`Logger.Enabled` is still more efficient than `Logger.Emit` (benchmarks
from #6315).

```
goos: linux
goarch: amd64
pkg: go.opentelemetry.io/otel/sdk/log
cpu: 13th Gen Intel(R) Core(TM) i7-13800H
BenchmarkLoggerEmit/5_attributes-20               559934              2391 ns/op           39088 B/op          1 allocs/op
BenchmarkLoggerEmit/10_attributes-20             1000000              5910 ns/op           49483 B/op          5 allocs/op
BenchmarkLoggerEnabled-20                        1605697               968.7 ns/op          1272 B/op          0 allocs/op
PASS
ok      go.opentelemetry.io/otel/sdk/log        10.789s
```

Prior art:
- #6271
- #6286

For tracking purposes, I also created:
- #6328
Labels
area:logs Part of OpenTelemetry logs pkg:SDK Related to an SDK package
Projects
Status: Done

5 participants